Amazon facial recognition technology found to have gender and racial bias

An MIT Media Lab study has revealed that Amazon’s Rekognition underperforms when identifying the gender of women and of individuals with darker skin.

Amazon facial recognition technology, Rekognition, is widely used by police departments and Immigration and Customs Enforcement (ICE) across the US. However, recent research has revealed that the deep-learning software, which has emerged as a leader in the facial recognition field, still has a few serious bugs to work out in regard to gender and racial bias.

Rekognition made no mistakes when identifying the gender of light-skinned men.

According to a recent study published by MIT Media Lab, the Amazon facial recognition technology had no trouble identifying the gender of light-skinned men. However, in the same tests, led by MIT’s Joy Buolamwini, Rekognition mistook women for men 19% of the time and mistook darker-skinned women for men 31% of the time.

This isn’t the first time Buolamwini has studied the performance of facial recognition technology, reports The Verge.

In February 2018, Buolamwini identified similar gender and racial bias problems in facial analysis software from IBM, Microsoft, and Megvii. Not long after Buolamwini shared her research results, both IBM and Microsoft promised to improve their software.

More specifically, IBM’s improvements included publishing a curated dataset that the company said would boost accuracy. Microsoft, on the other hand, has called for regulation of facial recognition technology to ensure the highest standards are applied to this software.

MIT’s study isn’t the only one to find Amazon’s facial recognition technology problematic.

As for Amazon’s Rekognition, the online retail giant and marketplace argues that the recent MIT research does not suggest anything about the accuracy of its technology. Amazon says that the researchers did not test the latest version of its Rekognition software. Furthermore, the company says that the gender identification test was facial analysis only and did not include facial identification.

The difference between the two is that facial analysis detects expressions and characteristics such as facial hair, while facial identification matches a scanned face against a database of known images, such as mugshots.

“It’s not possible to draw a conclusion on the accuracy of facial recognition for any use case — including law enforcement — based on results obtained using facial analysis,” Matt Wood, general manager of deep learning and AI at Amazon Web Services, stated in a press release.
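To make the distinction concrete, here is a minimal sketch using the AWS SDK for Python (boto3), which exposes both operations: detect_faces performs facial analysis, estimating attributes such as gender, while compare_faces performs facial identification, matching one face against another. The sketch assumes AWS credentials are already configured, and the image file names are placeholders.

```python
# Minimal sketch contrasting facial analysis with facial identification
# via the AWS SDK for Python (boto3). Assumes AWS credentials are
# configured; the image file names below are placeholders.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

def load_image(path):
    """Read an image file into bytes for the Rekognition API."""
    with open(path, "rb") as f:
        return f.read()

# Facial analysis: detect_faces estimates attributes of each detected
# face, such as gender, age range, and facial hair. This is the kind
# of operation the MIT study measured.
analysis = client.detect_faces(
    Image={"Bytes": load_image("photo.jpg")},
    Attributes=["ALL"],
)
for face in analysis["FaceDetails"]:
    gender = face["Gender"]
    print(f"Predicted gender: {gender['Value']} "
          f"(confidence: {gender['Confidence']:.1f}%)")

# Facial identification: compare_faces matches a face in one image
# against faces in another, e.g. a scanned face against a mugshot.
match = client.compare_faces(
    SourceImage={"Bytes": load_image("photo.jpg")},
    TargetImage={"Bytes": load_image("mugshot.jpg")},
    SimilarityThreshold=99,
)
for m in match["FaceMatches"]:
    print(f"Match found with similarity: {m['Similarity']:.1f}%")
```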

In spite of the company’s defense of its software, this latest MIT study isn’t the only one in which Amazon’s Rekognition has performed poorly.

Last year, an ACLU test of the Amazon facial recognition technology – specifically of Rekognition’s facial identification feature – found that it falsely matched photos of 28 members of Congress with police mugshots.

At that time, Amazon said poor calibration was to blame for these results: the ACLU ran its test at Rekognition’s default 80% confidence threshold rather than the 99% threshold Amazon recommends for law enforcement use.
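For context, that calibration is largely a matter of the confidence threshold the caller sets. Continuing the hedged boto3 sketch above, with the same client, helper, and placeholder images, the following shows how raising the threshold filters out weaker matches:

```python
# Sketch of how the confidence threshold affects identification results,
# reusing the client and load_image helper from the earlier example.
# A low threshold admits weaker (possibly false) matches; a high one does not.
for threshold in (80, 99):
    response = client.compare_faces(
        SourceImage={"Bytes": load_image("photo.jpg")},
        TargetImage={"Bytes": load_image("mugshot.jpg")},
        SimilarityThreshold=threshold,
    )
    print(f"Threshold {threshold}%: {len(response['FaceMatches'])} match(es)")
```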
